SNIWD: Simultaneous Weight Noise Injection with Weight Decay for MLP Training
Authors
Abstract
Although injecting noise during training has been demonstrated to enhance the fault tolerance of neural networks, theoretical analysis of the dynamics of these noise-injection-based online learning algorithms is far from complete. In particular, convergence proofs for these algorithms have not been established. In this regard, this paper presents an empirical study of the non-convergence behavior of injecting weight noise while training a multilayer perceptron, together with an online learning algorithm called SNIWD (simultaneous noise injection and weight decay) that overcomes this non-convergence problem. Simulation results show that SNIWD improves convergence and enforces small magnitudes on the network parameters (input weights, input biases, and output weights). Moreover, SNIWD gives the network fault-tolerance ability similar to that of the pure noise-injection approach.
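The idea described above can be sketched as an online update rule: at each step, Gaussian noise is injected into the weights before the forward/backward pass, and a weight-decay term is applied simultaneously with the resulting noisy gradient. The sketch below is illustrative only; the network shape, hyperparameter values, and function names are assumptions, not taken from the paper.

```python
import numpy as np

def sniwd_update(params, x, y, lr=0.05, sigma=0.01, lam=1e-3, rng=None):
    """One SNIWD-style online step for a one-hidden-layer MLP (hypothetical
    sketch; hyperparameters lr, sigma, lam are illustrative).

    Weight noise is injected into all parameters before computing the
    gradient, and a weight-decay term -lr*lam*w is applied in the same
    update, which is what keeps the parameter magnitudes small.
    """
    rng = np.random.default_rng() if rng is None else rng
    W1, b1, W2 = params["W1"], params["b1"], params["W2"]

    # Inject Gaussian weight noise for this training step only.
    nW1 = W1 + sigma * rng.standard_normal(W1.shape)
    nb1 = b1 + sigma * rng.standard_normal(b1.shape)
    nW2 = W2 + sigma * rng.standard_normal(W2.shape)

    # Forward pass with the noisy parameters (scalar linear output).
    h = np.tanh(nW1 @ x + nb1)
    err = nW2 @ h - y  # squared-error gradient seed

    # Backward pass: gradients w.r.t. the noisy parameters.
    gW2 = err * h
    gh = err * nW2 * (1.0 - h ** 2)  # backprop through tanh
    gW1 = np.outer(gh, x)
    gb1 = gh

    # Simultaneous update: noisy-gradient descent plus weight decay.
    return {
        "W1": W1 - lr * gW1 - lr * lam * W1,
        "b1": b1 - lr * gb1 - lr * lam * b1,
        "W2": W2 - lr * gW2 - lr * lam * W2,
    }
```

Note that the decay term is applied to the clean weights, not the noisy ones; without it, repeated noise injection alone can let the weight magnitudes drift, which is the non-convergence behavior the paper studies.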
Similar references
Convergence analysis of on-line weight noise injection training algorithms for MLP networks
Injecting weight noise during training has been proposed for almost two decades as a simple technique to improve the fault tolerance and generalization of a multilayer perceptron (MLP). However, little has been done regarding their convergence behaviors. Therefore, we present in this paper convergence proofs for two of these algorithms for MLPs. One is based on combining injecting multiplicativ...
Empirical studies on weight noise injection based online learning algorithms
While weight noise injection during training has been adopted for attaining fault-tolerant neural networks (NNs), theoretical and empirical studies of the online algorithms developed from these strategies have yet to be completed. In this paper, we present results on two important aspects of online learning algorithms based on combining weight noise injection and weight decay. Through intensi...
Note on Weight Noise Injection During Training a MLP
Although much analytical work has been done to investigate how the prediction error of a trained NN changes when its weights are injected with noise, few studies have truly investigated the dynamical properties (such as the objective functions and convergence behavior) of injecting weight noise during training. In this paper, four different online weight noise injection training algorithms for mul...
On Weight-Noise-Injection Training
While injecting weight noise during training has been proposed for more than a decade to improve the convergence, generalization, and fault tolerance of a neural network, not much theoretical work has been done on its convergence proof or the objective function it is minimizing. By applying the Gladyshev Theorem, it is shown that the convergence of injecting weight noise during training an...
Enhanced MLP performance and fault tolerance resulting from synaptic weight noise during training
We analyze the effects of analog noise on the synaptic arithmetic during multilayer perceptron training by expanding the cost function to include noise-mediated terms. Predictions made in light of these calculations suggest that fault tolerance, training quality, and training trajectory should all be improved by such noise injection. Extensive simulation experiments on two distinct cla...